Results 1 - 7 of 7
1.
Am J Trop Med Hyg ; 2023 Aug 21.
Article in English | MEDLINE | ID: mdl-37604476

ABSTRACT

Progress in malaria control has stalled in recent years. With growing resistance to existing vector control insecticides and the introduction of new vector control products, national malaria control programs (NMCPs) increasingly need to make data-driven, subnational decisions about vector control deployment. As NMCPs conduct subnational stratification of malaria control interventions, including vector control, country-specific frameworks and platforms are needed to guide how data are used in deployment decisions. Integrating routine health systems data, entomological data, and vector control program data in observational longitudinal analyses offers NMCPs and research institutions an opportunity to evaluate existing and novel vector control interventions. Drawing on the experience of implementing 22 vector control evaluations across 14 countries in sub-Saharan Africa, as well as published and gray literature on vector control impact evaluations using routine health information system data, this article provides practical guidance on the design of these evaluations, recommends key variables and data sources, and proposes methods to address challenges in data quality. Key recommendations include appropriate parameterization of impact and coverage indicators; incorporation of explanatory covariates and contextual factors from multiple sources (including rapid diagnostic test stockouts; insecticide susceptibility; vector density measures; vector control coverage, use, and durability; climate; and other malaria and non-malaria health programs); and assessment of data quality before the evaluation through either on-the-ground or remote data quality assessments. These recommendations may increase the frequency, rigor, and utilization of routine data sources to inform national program decision-making for vector control.

2.
Am J Trop Med Hyg ; 100(4): 889-898, 2019 04.
Article in English | MEDLINE | ID: mdl-30793695

ABSTRACT

Although on-site supervision programs are implemented in many countries to assess and improve the quality of care, few publications have described the use of electronic tools during health facility supervision. The President's Malaria Initiative-funded MalariaCare project developed the MalariaCare Electronic Data System (EDS), a custom-built, open-source, Java-based Android application that links to District Health Information Software 2 for data storage and visualization. The EDS was used during supervision visits at 4,951 health facilities across seven countries in Africa. Its introduction led to dramatic improvements in both the completeness and the timeliness of data on the quality of care provided for febrile patients. The EDS improved data completeness by 47 percentage points on average (from 42% to 89%) compared with paper-based data collection, and the average time from data submission to a final data analysis product dropped from over 5 months to 1 month. With more complete and timely data available, Ministry of Health and National Malaria Control Program (NMCP) staff could plan corrective actions more effectively and allocate resources promptly, ultimately leading to several improvements in the quality of malaria case management. Although government staff used supervision data during MalariaCare-supported lessons-learned workshops to develop plans that improved quality of care, data use outside these workshops has been limited. Additional efforts are required to institutionalize the use of supervision data within ministries of health and NMCPs.


Subjects
Case Management/standards, Data Accuracy, Malaria/diagnosis, Software/standards, Africa, Data Analysis, Health Facilities, Humans, Organization and Administration, Primary Health Care
3.
Am J Trop Med Hyg ; 100(4): 876-881, 2019 04.
Article in English | MEDLINE | ID: mdl-30793697

ABSTRACT

Rapid diagnostic tests (RDTs) are one of the primary tools for parasitological confirmation of suspected malaria cases. To ensure accurate results, health-care workers (HCWs) must perform the RDT correctly. Between 2015 and 2017, trained supervisors visited 3,603 facilities in eight African countries to assess RDT testing performance and conduct outreach training and supportive supervision, using a 12-point checklist to determine whether key steps were being performed. The proportion of HCWs performing each step correctly improved by between 1.1 and 21.0 percentage points from the first to the third visit. HCW scores were averaged to calculate facility scores, which were high: the average score across all facilities was 85% during the first visit and rose to 91% by the third visit. A regression analysis of these facility scores estimated that, holding key facility factors equal, facility performance improved by 5.3 percentage points from the first to the second visit (P < 0.001) but by only 0.6 percentage points between the second and third visits (P = 0.10). Factors strongly associated with higher scores included the presence of a laboratory worker at the facility and the presence of at least one staff member with previous formal training in malaria RDTs. The findings confirm that a comprehensive quality assurance system of training and supportive supervision consistently, and often significantly, improves RDT performance.
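The scoring scheme described in this abstract (a 12-point checklist per health-care worker, with facility scores computed as the mean of worker scores) can be sketched as follows. This is an illustrative reconstruction, not the study's actual code; the function names and input format are assumptions.

```python
CHECKLIST_STEPS = 12  # the 12-point RDT checklist described in the abstract

def hcw_score(steps_correct):
    """Percentage of the 12 checklist steps a health-care worker performed correctly."""
    return 100.0 * steps_correct / CHECKLIST_STEPS

def facility_score(steps_correct_per_hcw):
    """Facility score: the mean of its health-care workers' individual scores."""
    scores = [hcw_score(n) for n in steps_correct_per_hcw]
    return sum(scores) / len(scores)

# Hypothetical facility with two workers completing 12 and 9 steps correctly
print(facility_score([12, 9]))  # → 87.5
```

A facility-level average like this smooths over individual variation, which is why the abstract reports both step-level proportions and facility scores.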


Subjects
Clinical Laboratory Techniques, Health Personnel/education, Malaria/diagnosis, Professional Competence, Africa South of the Sahara, Health Facilities, Humans, Organization and Administration, Regression Analysis, Reproducibility of Results
4.
J Voice ; 30(5): 639.e11-6, 2016 Sep.
Article in English | MEDLINE | ID: mdl-26292795

ABSTRACT

OBJECTIVES/HYPOTHESIS: The purpose of this study was to describe athletes' beliefs about the performance benefit of using the voice during force production tasks and to determine whether vocalization during effortful tasks correlated with the perception of voice impairment. The hypotheses were as follows: (1) athletes believe that voicing during high-effort tasks improves performance, and (2) use of the voice during high-effort tasks is correlated with the perception of voice impairment. STUDY DESIGN: An anonymous online survey was used to query the presence and timing of voicing during force production. The Voice Handicap Index-10 and a visual analog scale for perceived phonatory effort were included in the survey. METHODS: The survey data were evaluated using descriptive statistics and analysis of variance. Correlation coefficients were calculated to determine whether there was a relationship between voice production and the perception of voice disorder. RESULTS: Survey responses from 378 participants were used in the data analyses. Fifty-six percent of the participating athletes believed that producing voice during maximum effort provided a performance advantage. There was no correlation between use of voice during force production and the perception of voice impairment. CONCLUSIONS: Most athletes believe that producing voice during maximum effort improves performance, without a perception of voice impairment. Further research should objectively measure force production in grunt and nongrunt trials concurrently with measures of vocal function.
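The correlation step in the methods above can be illustrated with a plain-Python Pearson coefficient. The study's actual statistical method and data are not specified beyond "correlation coefficients," so both the formula choice and the sample values below are assumptions for illustration only.

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

# Hypothetical data: self-reported voicing frequency vs. VHI-10 score, five athletes
voicing = [0, 1, 2, 3, 4]
vhi10 = [3, 2, 4, 3, 3]
print(round(pearson_r(voicing, vhi10), 3))  # → 0.224
```

A coefficient this close to zero on real data would match the study's finding of no correlation between voicing and perceived voice impairment.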


Subjects
Athletes/psychology, Athletic Performance, Muscle Contraction, Muscle Strength, Perception, Phonation, Voice Quality, Adult, Aged, Female, Humans, Internet, Male, Middle Aged, Risk Factors, Surveys and Questionnaires, Voice Disorders/etiology, Voice Disorders/physiopathology, Young Adult
5.
Cancer ; 118(3): 788-96, 2012 Feb 01.
Article in English | MEDLINE | ID: mdl-21720988

ABSTRACT

BACKGROUND: African American (AA) women experience higher breast cancer mortality than white (W) women. These differences persist even among estrogen receptor (ER)-positive breast cancers. The 21-gene recurrence score (RS) predicts recurrence in patients with ER-positive/lymph node-negative breast cancer, stratified by score: low risk (RS 0-18), intermediate risk (RS 19-31), and high risk (RS >31). The high-risk group is the most likely to benefit from chemotherapy, to achieve minimal benefit from hormonal therapy, and to exhibit lower ER levels (intrinsically luminal B cancers). In the current study, the authors investigated racial differences in RS testing, scores, treatment, and outcome. METHODS: Tumor registry data from 3 Atlanta hospitals identified women diagnosed with breast cancer during 2005 through 2009. Medical record abstraction provided information on RS and other tumor/treatment factors. Statistical analyses used chi-square/exact tests and logistic regression. RESULTS: Of 2186 patients (1192 AA women and 992 W women), 853 had stage I or II, ER-positive/lymph node-negative disease and thus were eligible for RS testing (AA = 372 [31.2%]; W = 481 [48.5%]; P < .0001); 272 women (31.8%) received testing (AA = 76 [20.4%]; W = 196 [40.7%]; P < .0001). Tumors were distributed by risk as follows: low risk (n = 133), intermediate risk (n = 113), and high risk (n = 26). The mean RS did not differ by race, but the risk-group distribution did (low-risk group: 46.1% vs 50% for AA and W women, respectively; high-risk group: 15.8% vs 7.1%; P = .043). In multivariate analyses, AA race (odds ratio, 3.6) was independently associated with high risk scores. CONCLUSIONS: AA women were half as likely as W women to receive 21-gene RS testing but were 2-fold more likely to be categorized as high risk. The current data suggest that testing guidelines are not applied equivalently, that testing bias may attenuate racial differences in RS, and that disparate outcomes may be explained in part by differences in RS, although compliance and pharmacogenomics may also play a role.
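The RS cutoffs quoted in this abstract (low 0-18, intermediate 19-31, high >31) translate directly into a categorization rule. A minimal sketch, assuming the conventional 0-100 reporting scale for the score:

```python
def rs_risk_group(rs):
    """Map a 21-gene recurrence score to the risk group used in the abstract."""
    if not 0 <= rs <= 100:  # RS is conventionally reported on a 0-100 scale
        raise ValueError("recurrence score out of range")
    if rs <= 18:
        return "low"
    if rs <= 31:
        return "intermediate"
    return "high"

print([rs_risk_group(s) for s in (10, 25, 40)])  # → ['low', 'intermediate', 'high']
```

Boundary values matter here: 18 and 31 fall in the lower of the two adjacent groups, exactly as the abstract's ranges state.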


Subjects
Biomarkers, Tumor/genetics, Black or African American/genetics, Breast Neoplasms/ethnology, Gene Expression Profiling, Neoplasm Recurrence, Local/diagnosis, Neoplasm Recurrence, Local/ethnology, Reagent Kits, Diagnostic, White People/genetics, Adult, Aged, Breast Neoplasms/genetics, Breast Neoplasms/therapy, Carcinoma, Ductal, Breast/ethnology, Carcinoma, Ductal, Breast/genetics, Carcinoma, Ductal, Breast/therapy, Carcinoma, Lobular/ethnology, Carcinoma, Lobular/genetics, Carcinoma, Lobular/therapy, Female, Humans, Middle Aged, Neoplasm Grading, Neoplasm Recurrence, Local/therapy, Neoplasm Staging, Receptors, Estrogen/metabolism, Receptors, Progesterone/metabolism, Registries, Treatment Outcome
6.
Oecologia ; 151(3): 454-63, 2007 Mar.
Article in English | MEDLINE | ID: mdl-17106720

ABSTRACT

Herbivore-induced defences appear ubiquitous across most biomes and habitats. Yet the direct link between induced changes in host plant chemistry and the population dynamics of the herbivore remains untested in many systems. In terrestrial plant-herbivore interactions, indirect or tritrophic interactions appear to be one successful route by which changes in host plant chemistry induced by prior herbivory can affect herbivore populations, via the increased success of natural enemies. This set of interactions remains untested in marine systems. Here, we present experiments using orthogonal contrasts of plants with different prior treatments (control, mechanical damage, or herbivory) and the presence or absence of herbivores to examine the foraging behaviour of a crab, Carcinus maenas, and a fish, Lipophrys pholis. These experiments were carried out in a novel flow-through flume, i.e., a choice chamber supplied with turbulent water from independent cue sources. Our results show that in the Ascophyllum nodosum (plant)-Littorina obtusata (herbivore) system, infochemicals from induced plants can directly influence predator foraging behaviour. L. pholis was attracted to the presence of a feeding L. obtusata but was also more attracted to odours from herbivore-induced tissue than to odours from mechanically damaged or naïve A. nodosum. C. maenas was more attracted to odours from herbivore-induced tissue than from naïve tissue, regardless of the presence of L. obtusata. This is the first demonstration of such behavioural consequences of herbivore-induced changes in plants for marine systems.


Subjects
Appetitive Behavior/physiology, Ascophyllum/chemistry, Brachyura/physiology, Cues, Food Chain, Odorants, Perciformes/physiology, Analysis of Variance, Animals, Population Dynamics
7.
J Org Chem ; 61(14): 4617-4622, 1996 Jul 12.
Article in English | MEDLINE | ID: mdl-11667389

ABSTRACT

Syntheses of 5-isopropyl-1,3-cyclohexadiene and syn-5-isopropyl-2,3-dioxabicyclo[2.2.2]octane, by routes that would allow completely diastereoselective introduction of deuterium labels, are described. The reaction of the isopropyl cyclohexadiene with singlet oxygen is shown to give an endoperoxide derived from preferential attack on the more sterically hindered face of the diene. A possible mechanistic explanation of this result is that attack from the less hindered face leads to the "ene" reaction rather than to endoperoxide formation. However, this mechanism would require that the "ene" reaction and the cycloaddition proceed via a common intermediate, presumably a perepoxide.
